National Repository of Grey Literature: 9 records found
Analysis of Mobile Devices Network Communication Data
Abraham, Lukáš ; Bartík, Vladimír (referee) ; Burgetová, Ivana (advisor)
The thesis first describes the DNS and SSL/TLS protocols, focusing on communication between devices that use them. It then covers data preprocessing and data cleaning, followed by basic data mining techniques such as classification, association rules, information retrieval, regression analysis, and cluster analysis. The next chapter discusses how mobile devices can be identified on a network. The thesis evaluates data sets containing communication data collected over the above-mentioned protocols, which are used in the practical part. It then presents the design of a system for analyzing network communication data, describes the libraries used and the implementation of the whole system, and concludes with a large number of experiments and their evaluation.
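A simple way to picture device identification from network traffic, as the abstract describes, is to build a per-device profile of queried DNS names; the log entries, IP addresses, and domains below are purely illustrative, not data from the thesis:

```python
from collections import Counter

# Hypothetical DNS query log: (device_ip, queried_domain) pairs.
# Real input would come from captured DNS/TLS traffic.
log = [
    ("10.0.0.2", "connectivitycheck.gstatic.com"),
    ("10.0.0.2", "play.googleapis.com"),
    ("10.0.0.2", "play.googleapis.com"),
    ("10.0.0.3", "captive.apple.com"),
    ("10.0.0.3", "gateway.icloud.com"),
]

def device_profiles(entries):
    """Build a per-device frequency profile of queried domains."""
    profiles = {}
    for ip, domain in entries:
        profiles.setdefault(ip, Counter())[domain] += 1
    return profiles

profiles = device_profiles(log)
# The most frequent domains hint at the device's platform.
print(profiles["10.0.0.2"].most_common(1))  # [('play.googleapis.com', 2)]
```

Such frequency profiles can then feed the classification or clustering techniques the abstract lists.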
Methodology and problems of data transformation and determining its importance in the integration of heterogeneous information sources
Bartoš, Ivan ; Papík, Richard (advisor) ; Dvořák, Jan (referee) ; Bureš, Miroslav (referee)
Methodology and issues of data transformation and its information value estimation during the integration of heterogeneous information sources PhDr. Ivan BARTOŠ Abstract This study focuses mainly on issues of data and information transformation, a topic currently critical in several scientific and commercial areas. Information value, information quality, and the quality of the source data differ between systems, not only because of the different topologies of the information sources but also because of different understandings of, and ways of storing, the information describing an enterprise entity. Such information systems, or database systems in the scope of this thesis, can perform well as stand-alone systems. Problems appear the moment such heterogeneous systems need to be integrated and information migrated between them. The thesis is logically divided into four major parts based on these issues. The first part describes methods that can be used to classify the data quality of the source system (the one to be integrated) from which information can be extracted. Given the common lack of project and system documentation, the methods introduced here can be used for such qualification even when the...
An analysis and implementation of Dashboards within SAP Business Objects 4.0/4.1
Kratochvíl, Tomáš ; Pour, Jan (advisor) ; Šedivá, Zuzana (referee)
The diploma thesis focuses on the analysis and distribution of dashboards and their subsequent implementation in the SAP Dashboards and Web Intelligence tools. Its main goal is an analysis of dashboards for different areas of company management according to the chosen solution architecture. Another goal is to take into account the principles of dashboards within the company, including a comparison of indicators. In the theoretical part, the author further defines the data life cycle within Business Intelligence and decomposes the particular dashboard types. The theory closes with an important chapter on data quality, the data quality process and its improvement, and the use of SAP Best Practices and KBAs for the BI tools published by SAP. The implementation of dashboards backs up the theoretical part; it is divided into three chapters according to the selected architecture: using multi-source systems, using SAP InfoSets/Query, and using a Data Warehouse or Data Mart as the architecture for reporting purposes. The in-depth implementation section should help readers form their own opinion on the different architectures, and especially on the differences between the BI tools within SAP Business Objects. Each architecture section ends with its pros and cons.
Data quality management in small and medium enterprises
Zelený, Pavel ; Pour, Jan (advisor) ; Novotný, Ota (referee)
This diploma thesis deals with data quality management. Many tools and methodologies support data quality management, even on the Czech market, but they all target large companies; small and medium-sized companies cannot afford them because of the high cost. The first goal of this thesis is to summarize the principles of these methodologies and, based on them, to propose a simpler methodology for small and medium-sized companies. In the second part of the thesis, the methodology is created and adapted for a specific company. The first step is to choose the data area of interest in the company. Because buying a software tool to clean the data is not an option, relatively simple rules are defined that serve as the basis for data-cleaning scripts in SQL; the scripts are used for automatic data cleaning. A further analysis decides which data should be cleaned manually. The next step describes recommendations on how to remove duplicates from the database, using functionality of the company's production system. The last step of the methodology is to create a control mechanism that keeps the data at the required quality in the future. The thesis ends with a data survey of four data sources, all from companies using the same production system, to present an overview of their data quality and to help those companies decide about cleaning their data.
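Rule-based cleaning scripts in SQL, as mentioned in the abstract, can be sketched with an in-memory database; the table, columns, and the two rules below are illustrative assumptions, not rules from the thesis:

```python
import sqlite3

# Minimal sketch of rule-based data cleaning in SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER, email TEXT, phone TEXT)")
conn.executemany(
    "INSERT INTO customer VALUES (?, ?, ?)",
    [(1, "  A@B.CZ ", "777 123 456"), (2, "invalid", None)],
)

# Rule 1 (automatic): normalize e-mail addresses (trim whitespace, lower-case).
conn.execute("UPDATE customer SET email = lower(trim(email))")

# Rule 2 (flag for manual cleaning): e-mail without an '@' is suspect.
bad = conn.execute("SELECT id FROM customer WHERE email NOT LIKE '%@%'").fetchall()
print(bad)  # [(2,)]
```

The same split applies in practice: rules that are safe to automate run as UPDATE scripts, while the rest only flags records for manual review.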
Master Data Integration Hub - a solution for company-wide consolidation of referential data
Bartoš, Jan ; Slánský, David (advisor) ; Pour, Jan (referee)
In current information systems, the requirement to integrate disparate applications into a cohesive package is greatly accented. While well-established technologies facilitating functional and communicational integration (ESB, message brokers, web services) already exist, tools and methodologies for continuous integration of disparate data sources at the enterprise level are still in development. Master Data Management (MDM) is a major approach in the area of data integration and referential data management in particular. It encompasses referential data integration, data quality management, referential data consolidation, metadata management, master data ownership, the principle of accountability for master data, and the processes related to referential data management. The thesis focuses on the technological aspects of an MDM implementation realized by introducing a centralized repository for master data, the Master Data Integration Hub (MDI Hub). The MDI Hub is an application that enables the integration and consolidation of referential data stored in disparate systems and applications based on predefined workflows. It also handles the propagation of master data back to the source systems and provides services such as dictionary management and data quality monitoring. The objective of the thesis is to cover the design and implementation aspects of the MDI Hub, which forms the application part of MDM. The introduction discusses the motivation for referential data consolidation and lists the techniques used in developing the MDI Hub solution. The main part of the thesis proposes the design of an MDI Hub reference architecture and suggests the activities performed in the process of an MDI Hub implementation. The thesis is based on information gained from specialized publications, on knowledge gathered by delivering projects with the companies Adastra and Ataccama, and on co-workers' know-how and experience.
The most important contribution of the thesis is its comprehensive view of MDI Hub design and its proposal of an MDI Hub reference architecture, which can serve as a basis for a particular MDI Hub implementation.
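The consolidation step such a hub performs can be pictured as merging records for the same real-world entity arriving from different source systems into one golden record; the source names, fields, and the naive key matching below are illustrative simplifications, not the thesis's design:

```python
# Records for the same company arrive from two hypothetical source systems.
crm = [{"name": "ACME s.r.o.", "city": "Praha", "vat": None}]
erp = [{"name": "acme s.r.o.", "city": None, "vat": "CZ12345678"}]

def golden_records(*sources):
    """Merge records matched by normalized name; first non-empty value wins."""
    merged = {}
    for source in sources:
        for rec in source:
            key = rec["name"].strip().lower()
            target = merged.setdefault(key, {})
            for field, value in rec.items():
                if value is not None:
                    target.setdefault(field, value)
    return merged

print(golden_records(crm, erp))
```

A real MDI Hub would use far more robust matching and survivorship rules, plus propagation of the golden record back to the source systems, but the merge-and-survive pattern is the same.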
Data Quality, Data Integrity and Data Consolidation in BI
Smolík, Ondřej ; Pour, Jan (advisor) ; Zajíc, Ján (referee)
This thesis tackles data quality in business intelligence. We present basic principles for building a data warehouse so as to achieve the highest data quality, along with data cleansing methods such as deviation detection and name and address cleansing. The work also deals with the origin of erroneous data and the prevention of its generation. In the second part of the thesis we demonstrate the presented methods and principles on a real example of a data warehouse and suggest how to obtain sales data from our business partners and customers.
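Deviation detection, one of the cleansing methods the abstract names, can be sketched as flagging values that lie far from the mean; the two-standard-deviation threshold and the price data below are illustrative choices, not taken from the thesis:

```python
import statistics

def deviations(values, threshold=2.0):
    """Flag values more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) > threshold * stdev]

# 970 is the kind of entry error (a likely typo for 97.0) this catches.
prices = [100, 102, 98, 101, 99, 100, 970]
print(deviations(prices))  # [970]
```

Flagged values are candidates for review, not automatic deletion, since an outlier may also be a legitimate extreme.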
Data Quality and Effective Use of Registers of State Administration
Rut, Lukáš ; Chlapek, Dušan (advisor) ; Jankech, Pavel (referee)
This diploma thesis deals with the registers of state administration in terms of data quality. The main objective is to analyze ways of evaluating data quality and to apply an appropriate method to the data in the business register. Another objective is to analyze the possibilities of data cleansing and data quality improvement and to propose a solution for the inaccuracies found in the business register. The last goal is to analyze approaches to assigning person identifiers and to choose a suitable key for identifying persons in the registers of state administration. The thesis is divided into several parts. The first introduces the sphere of state administration registers and closely analyzes several selected registers, especially in terms of what data they contain and how they are updated. A major contribution of this part is its description of the legislative changes that come into effect in the middle of 2010, with special attention to their impact from the data quality point of view. The next part deals with the problem of identifiers for legal entities and natural persons and offers a possible way to identify entities in register data. The third part analyzes ways to determine data quality; the method called data profiling is described in detail and applied in an extensive data quality analysis of the business register, whose outputs are correct metadata and information about incorrect data. The last chapter deals with ways to solve data quality problems; three variants of a solution are proposed and compared. As a whole, the paper represents a compact guide to solving problems with the effective use of the data contained in the registers of state administration; the proposed solutions and described approaches can nevertheless be used in many other projects dealing with data quality.
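Data profiling, the method the abstract applies to the business register, boils down to computing per-column statistics such as fill rate and distinct-value count; the column names and sample records below are illustrative, not data from the register:

```python
def profile(rows):
    """Per-column fill rate and distinct-value count for a list of records."""
    total = len(rows)
    report = {}
    for col in rows[0].keys():
        values = [r[col] for r in rows if r[col] not in (None, "")]
        report[col] = {
            "fill_rate": len(values) / total,
            "distinct": len(set(values)),
        }
    return report

register = [
    {"ico": "12345678", "city": "Praha"},
    {"ico": "87654321", "city": ""},
    {"ico": "12345678", "city": "Brno"},
]
print(profile(register))
```

Low fill rates and unexpectedly low or high distinct counts are exactly the signals that point a profiling analysis toward incorrect data.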
The influence of etalons on data quality in companies
Bukovský, Radim ; Slánský, David (advisor) ; Pour, Jan (referee)
This thesis is dedicated to everyone interested in data quality, and in data cleansing in particular. Anyone who wants to learn more about etalons, an essential part of data quality, is welcome. It offers an insight into the everyday activities of people who not only create and administer etalons but also implement them on particular projects. The reader will also gain detailed knowledge of how etalons can help companies decrease their costs and, through data quality, make a better impression on their clients.
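An etalon in this data quality sense is a reference list of permitted values that records are validated against; the country-code etalon and records below are illustrative assumptions, not material from the thesis:

```python
# Illustrative etalon: the set of valid country codes.
ETALON_COUNTRIES = {"CZ", "SK", "DE", "AT", "PL"}

def validate(records, etalon, field):
    """Split records into valid and invalid by membership in the etalon."""
    valid = [r for r in records if r[field] in etalon]
    invalid = [r for r in records if r[field] not in etalon]
    return valid, invalid

rows = [{"id": 1, "country": "CZ"}, {"id": 2, "country": "Czechia"}]
valid, invalid = validate(rows, ETALON_COUNTRIES, "country")
print([r["id"] for r in invalid])  # [2]
```

Invalid records are then either mapped to the etalon value (here, "Czechia" to "CZ") or routed to manual cleansing, which is where the cost savings the abstract mentions come from.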
